Structured Variational Distributions in VIBES
Abstract
Variational methods are becoming increasingly popular for the approximate solution of complex probabilistic models in machine learning, computer vision, information retrieval and many other fields. Unfortunately, for every new application it is necessary first to derive the specific forms of the variational update equations for the particular probabilistic model being used, and then to implement these equations in application-specific software. Each of these steps is both time consuming and error prone. We have therefore recently developed a general purpose inference engine called VIBES [1] (‘Variational Inference for Bayesian Networks’) which allows a wide variety of probabilistic models to be implemented and solved variationally without recourse to coding. New models are specified as a directed acyclic graph using an interface analogous to a drawing package, and VIBES then automatically generates and solves the variational equations. The original version of VIBES assumed a fully factorized variational posterior distribution. In this paper we present an extension of VIBES in which the variational posterior distribution corresponds to a sub-graph of the full probabilistic model. Such structured distributions can produce much closer approximations to the true posterior distribution. We illustrate this approach using an example based on Bayesian hidden Markov models.
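To make the "fully factorized" baseline concrete, here is a minimal sketch of the kind of coordinate-ascent updates an engine like VIBES automates, for a conjugate Gaussian model with unknown mean and precision under the mean-field assumption q(μ, τ) = q(μ) q(τ). The data, priors, and model choice are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Synthetic data (illustrative; not from the paper)
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=500)
N, xbar = len(x), x.mean()

# Conjugate priors: mu ~ N(mu0, (lam0 * tau)^-1), tau ~ Gamma(a0, b0)
mu0, lam0, a0, b0 = 0.0, 1.0, 1.0, 1.0

# Coordinate ascent on the fully factorized posterior q(mu) q(tau):
# each factor's update depends only on expectations under the other.
E_tau = a0 / b0
for _ in range(100):
    # Update q(mu) = N(m, lam^-1), given E[tau]
    m = (lam0 * mu0 + N * xbar) / (lam0 + N)
    lam = (lam0 + N) * E_tau
    # Update q(tau) = Gamma(a, b), given E[(mu - c)^2] = (m - c)^2 + 1/lam
    a = a0 + 0.5 * (N + 1)
    b = b0 + 0.5 * (lam0 * ((m - mu0) ** 2 + 1.0 / lam)
                    + np.sum((x - m) ** 2) + N / lam)
    E_tau = a / b

print(m, E_tau)  # variational posterior means of mu and tau
```

Deriving such updates by hand for each new model is exactly the repetitive, error-prone step the abstract describes; a structured posterior would instead retain some of the dependencies that the factorization q(μ) q(τ) discards.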
Similar references
VIBES: A Variational Inference Engine for Bayesian Networks
In recent years variational methods have become a popular tool for approximate inference and learning in a wide variety of probabilistic models. For each new application, however, it is currently necessary first to derive the variational update equations, and then to implement them in application-specific code. Each of these steps is both time consuming and error prone. In this paper we describ...
Variational Message Passing
Bayesian inference is now widely established as one of the principal foundations for machine learning. In practice, exact inference is rarely possible, and so a variety of approximation techniques have been developed, one of the most widely used being a deterministic framework called variational inference. In this paper we introduce Variational Message Passing (VMP), a general purpose algorithm...
Variational Structured Stochastic Network
High-dimensional sequential data exhibits complex structure; a successful generative model for such data must involve highly dependent, structured variables. It is therefore desirable, or even necessary, to model the correlations and dependencies between the multiple input, output and latent variables in such scenarios. To achieve this goal, we introduce the Variational Structured Stochastic Network (VS...
An inexact alternating direction method with SQP regularization for the structured variational inequalities
In this paper, we propose an inexact alternating direction method with square quadratic proximal (SQP) regularization for the structured variational inequalities. The predictor is obtained by solving the SQP system approximately under a significantly relaxed accuracy criterion, and the new iterate is computed directly by an explicit formula derived from the original SQP method. Under appropriat...
Copula variational inference
We develop a general variational inference method that preserves dependency among the latent variables. Our method uses copulas to augment the families of distributions used in mean-field and structured approximations. Copulas model the dependency that is not captured by the original variational distribution, and thus the augmented variational family guarantees better approximations to the poste...
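The copula idea can be illustrated outside any inference loop: pushing correlated Gaussians through the standard normal CDF yields dependent uniforms (a Gaussian copula), and inverse marginal CDFs then impose arbitrary marginals while the copula carries the dependency that a fully factorized distribution would lose. A minimal NumPy sketch, with illustrative exponential marginals and correlation ρ = 0.8 chosen here for demonstration:

```python
import math
import numpy as np

# Draw correlated standard Gaussians (the copula's dependency source)
rng = np.random.default_rng(1)
rho = 0.8
cov = [[1.0, rho], [rho, 1.0]]
z = rng.multivariate_normal([0.0, 0.0], cov, size=20000)

# Standard normal CDF, vectorised via math.erf: maps each column to
# a uniform marginal while preserving the Gaussian-copula dependence
phi = np.vectorize(lambda t: 0.5 * (1.0 + math.erf(t / math.sqrt(2.0))))
u = phi(z)

# Inverse CDF of Exp(1): F^-1(u) = -log(1 - u) imposes the marginals
xy = -np.log(1.0 - u)
```

Each column of `xy` is (approximately) Exp(1)-distributed, yet the two columns remain strongly correlated: the marginal form and the dependency structure are specified independently, which is what lets copulas augment a factorized variational family.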